AIbase

# Multilingual Foundation Model

## decapoda-research-llama-7b-hf

Uploaded by: linhvu · License: Other
LLaMA-7B is an efficient foundational language model developed by Meta AI. Built on the Transformer architecture with 7 billion parameters, it is well suited to natural language processing research.
Tags: Large Language Model · Transformers
Downloads: 860 · Likes: 8
## decapoda-research-llama-7B-hf

Uploaded by: baffo32 · License: Other
LLaMA is an efficient foundational language model developed by Meta AI, available in parameter sizes from 7B to 65B. Built on the Transformer architecture, it is suited to a wide range of natural language processing tasks.
Tags: Large Language Model · Transformers
Downloads: 12.29k · Likes: 63
© 2025 AIbase